TDTO language modeling with feedforward neural networks

Authors

  • Tze Yuang Chong
  • Rafael E. Banchs
  • Chng Eng Siong
  • Haizhou Li
Abstract

In this paper, we describe the use of feedforward neural networks to improve the term-distance term-occurrence (TDTO) language model previously proposed in [1]−[3]. The main idea behind the TDTO model is to model the position and occurrence information of words in the history context separately, so as to better estimate n-gram probabilities. Neural networks have been shown to offer better generalization than conventional smoothing methods; we take advantage of this property to obtain a better smoothing mechanism for the TDTO model, referred to as the continuous space TDTO (cTDTO). The proposed model improves perplexity over the baseline TDTO model by up to 9.2% at a history length of ten, as evaluated on the Wall Street Journal (WSJ) corpus. Moreover, in the Aurora-4 speech recognition N-best re-ranking task, the cTDTO outperformed the TDTO model, reducing the word error rate (WER) by up to 12.9% relative.
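The decomposition described above — a term-occurrence representation of which words appear in the history and a term-distance representation of where they appear, combined in a continuous space and smoothed by a feedforward network — can be illustrated with a minimal sketch. All names, toy sizes, and the exact way the two embeddings are combined are our own illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Hypothetical sketch of the cTDTO idea: each history word contributes
# a term-occurrence embedding (word identity) and a term-distance
# embedding (its position in the history); a feedforward layer maps
# the combined features to next-word scores.

rng = np.random.default_rng(0)

VOCAB, HIST, EMB, HID = 50, 10, 16, 32   # toy sizes, chosen arbitrarily

E_occ = rng.normal(size=(VOCAB, EMB))    # term-occurrence embeddings
E_dist = rng.normal(size=(HIST, EMB))    # term-distance embeddings
W1 = rng.normal(size=(HIST * EMB, HID))  # hidden layer weights
W2 = rng.normal(size=(HID, VOCAB))       # output layer weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def next_word_probs(history):
    """history: list of HIST word ids, most recent last."""
    # Sum each word's occurrence embedding with its distance embedding,
    # then concatenate over history positions.
    feats = np.concatenate(
        [E_occ[w] + E_dist[d] for d, w in enumerate(history)]
    )
    h = np.tanh(feats @ W1)
    return softmax(h @ W2)

p = next_word_probs(list(range(HIST)))   # a valid distribution over VOCAB
```

Because the distance embeddings live in a continuous space, histories that differ only in where a word occurs map to nearby feature vectors, which is the smoothing behaviour the abstract attributes to the neural parameterization.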


Similar articles

Feedforward Sequential Memory Neural Networks without Recurrent Feedback

We introduce a new structure for memory neural networks, called feedforward sequential memory networks (FSMN), which can learn long-term dependencies without using recurrent feedback. The proposed FSMN is a standard feedforward neural network equipped with learnable sequential memory blocks in the hidden layers. In this work, we have applied FSMN to several language modeling (LM) tasks. Experime...
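The memory-block idea in this abstract — a feedforward hidden layer augmented with learnable taps over the N previous hidden states, with no recurrent feedback — can be sketched roughly as follows. The tap formula and all sizes here are our own assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Illustrative FSMN-style memory block: the memory output at time t is
# a learnable weighted sum of the current and N previous hidden states.
# Unlike an RNN, nothing feeds back into the computation of h itself.

rng = np.random.default_rng(1)
T, H, N = 6, 4, 3                  # timesteps, hidden size, memory order

h = rng.normal(size=(T, H))        # hidden activations h_0 .. h_{T-1}
a = rng.normal(size=(N + 1, H))    # learnable tap coefficients a_0 .. a_N

def memory_block(h, a):
    """out[t] = sum_i a[i] * h[t - i], truncated at the sequence start."""
    out = np.zeros_like(h)
    for t in range(len(h)):
        for i in range(min(t + 1, len(a))):
            out[t] += a[i] * h[t - i]
    return out

h_tilde = memory_block(h, a)       # same shape as h, now context-aware
```

Because the taps form a fixed-length window, the whole block can be computed without sequential recurrence, which is the property the abstract emphasizes.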


Evaluating the performance of different artificial intelligence methods and a statistical method in runoff estimation (case study: Shahid Noori watershed, Kakhk, Gonabad)

Rainfall-runoff models have been used in hydrology and runoff estimation for many years, but despite the many existing models, the regular release of new ones shows that there is still no model that provides estimates with high accuracy and performance. In order to achieve the best results, modeling and identification of the factors affecting the output of the model i...


Compact Feedforward Sequential Memory Networks for Large Vocabulary Continuous Speech Recognition

In acoustic modeling for large vocabulary continuous speech recognition, it is essential to model long-term dependencies within speech signals. Usually, recurrent neural network (RNN) architectures, especially long short-term memory (LSTM) models, are the most popular choice. Recently, a novel architecture, namely feedforward sequential memory networks (FSMN), provides a non-recurrent archite...


Residual Memory Networks in Language Modeling: Improving the Reputation of Feed-Forward Networks

We introduce the Residual Memory Network (RMN) architecture to language modeling. RMN is a feedforward neural network architecture that combines residual connections with time-delay connections, allowing it to naturally incorporate information from a substantial time context. As this is the first time RMNs are applied to language modeling, we thoroughly investigate their behaviour on ...
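The two ingredients this abstract names — a time-delay connection that taps an earlier timestep, and a residual connection that adds the layer's input back to its output — can be combined in a minimal sketch. The update rule, names, and sizes below are our own illustrative assumptions, not the RMN paper's exact formulation:

```python
import numpy as np

# Illustrative residual + time-delay layer: the output at time t adds a
# transformed copy of the activation from `delay` steps earlier onto the
# untouched input (the residual path), all without recurrent feedback.

rng = np.random.default_rng(2)
T, H, DELAY = 8, 4, 2

x = rng.normal(size=(T, H))        # activations from a lower layer
W = rng.normal(size=(H, H)) * 0.1  # weights on the time-delay tap

def delay_residual_layer(x, W, delay):
    """y[t] = x[t] + tanh(W @ x[t - delay]); early steps pass through."""
    y = x.copy()
    for t in range(delay, len(x)):
        y[t] = x[t] + np.tanh(W @ x[t - delay])
    return y

y = delay_residual_layer(x, W, DELAY)
```

Stacking such layers with different delays widens the effective time context additively, which is how a feedforward stack can cover a "substantial time context" without recurrence.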


Stochastic Dropout: Activation-level Dropout to Learn Better Neural Language Models

Recurrent Neural Networks are very powerful computational tools that are capable of learning many tasks across different domains. However, they are prone to overfitting and can be very difficult to regularize. Inspired by Recurrent Dropout [1] and Skip-connections [2], we describe a new and simple regularization scheme: Stochastic Dropout. It resembles the structure of recurrent dropout, but offe...



Publication date: 2015